Phase-conjugate reflection by degenerate four-wave mixing in a nematic liquid crystal in the isotropic phase
We report the generation of conjugate wave fronts by degenerate four-wave mixing in the isotropic phase of the nematic substance p-methoxy-benzylidene p-n-butylaniline. The temporal and spatial properties of the conjugate wave fronts are verified. The dependence of the nonlinear reflectivity on the pump-wave power and the temperature of the medium is discussed.
Reaching Approximate Byzantine Consensus with Multi-hop Communication
We address the problem of reaching consensus in the presence of Byzantine
faults. In particular, we are interested in investigating the impact of
message relay on the network connectivity required for a correct iterative
approximate Byzantine consensus algorithm to exist. The network is modeled by
a simple directed graph. We assume a node can send messages to another node
that is up to l hops away via forwarding by the intermediate nodes on the
routes, where l is a natural number. We characterize the necessary and
sufficient topological conditions on the network structure. The tight
conditions we found are consistent with the tight conditions identified for
l = 1, where only local communication is allowed, and are strictly weaker for
l > 1. Let l* denote the length of a longest path in the given network. For
l ≥ l* and undirected graphs, our conditions hold if and only if n ≥ 3f + 1
and the node-connectivity of the given graph is at least 2f + 1, where n
is the total number of nodes and f is the maximal number of Byzantine
nodes; and for l ≥ l* and directed graphs, our condition is equivalent to
the tight condition found for exact Byzantine consensus.
Our sufficiency is shown by constructing a correct algorithm whose trim
function is based on a newly introduced minimal messages cover property. The
proposed trim function also works over multi-graphs.
Comment: 24 pages, 1 figure. arXiv admin note: text overlap with
arXiv:1203.188
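The paper's trim function is built on its minimal messages cover property for multi-hop relays. As a simpler, hedged illustration of the general iterative idea, the sketch below shows the classic single-hop trimmed-mean update, in which each fault-free node discards the f largest and f smallest received values before averaging; the function name and interface are illustrative, not taken from the paper:

```python
def trimmed_mean_update(own_value, received_values, f):
    """One iteration of a single-hop approximate Byzantine consensus update.

    Each fault-free node discards the f largest and f smallest received
    values (so any surviving Byzantine value is bracketed by fault-free
    values), then averages the survivors together with its own value.
    """
    vals = sorted(received_values)
    if len(vals) <= 2 * f:
        raise ValueError("need more than 2f neighbour values to trim safely")
    survivors = vals[f:len(vals) - f]
    return sum([own_value] + survivors) / (len(survivors) + 1)
```

Repeating this update at every fault-free node shrinks the interval spanned by their values, which is the convergence mechanism that the multi-hop conditions in the abstract generalize.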
Dynamic Composite Data Physicalization Using Wheeled Micro-Robots
This paper introduces dynamic composite physicalizations, a new class of physical visualizations that use collections of self-propelled objects to represent data. Dynamic composite physicalizations can be used both to give physical form to well-known interactive visualization techniques, and to explore new visualizations and interaction paradigms. We first propose a design space characterizing composite physicalizations based on previous work in the fields of Information Visualization and Human Computer Interaction. We illustrate dynamic composite physicalizations in two scenarios demonstrating potential benefits for collaboration and decision making, as well as new opportunities for physical interaction. We then describe our implementation using wheeled micro-robots capable of locating themselves and sensing user input, before discussing limitations and opportunities for future work.
Solving a "Hard" Problem to Approximate an "Easy" One: Heuristics for Maximum Matchings and Maximum Traveling Salesman Problems
We consider geometric instances of the Maximum Weighted Matching Problem
(MWMP) and the Maximum Traveling Salesman Problem (MTSP) with up to 3,000,000
vertices. Making use of a geometric duality relationship between MWMP, MTSP,
and the Fermat-Weber-Problem (FWP), we develop a heuristic approach that
yields both solutions and upper bounds in near-linear time. Using various
computational tools, we get solutions within considerably less than 1% of the
optimum.
An interesting feature of our approach is that, even though an FWP is hard to
compute in theory and Edmonds' algorithm for maximum weighted matching yields a
polynomial solution for the MWMP, the practical behavior is just the opposite,
and we can solve the FWP with high accuracy in order to find a good heuristic
solution for the MWMP.
Comment: 20 pages, 14 figures, LaTeX, to appear in Journal of Experimental
Algorithms, 200
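The abstract's point that the FWP is solvable with high accuracy in practice is usually realized with Weiszfeld's fixed-point iteration, the standard method for the Fermat-Weber point. The sketch below is a minimal, assumed 2D implementation for illustration, not necessarily the heuristic used in the paper:

```python
import math

def weiszfeld(points, iters=200, eps=1e-12):
    """Approximate the Fermat-Weber point (the minimizer of the summed
    Euclidean distances to the given points) by Weiszfeld's iteration:
    repeatedly replace the current iterate with the distance-weighted
    average of the points."""
    # Start at the centroid.
    x = sum(p[0] for p in points) / len(points)
    y = sum(p[1] for p in points) / len(points)
    for _ in range(iters):
        wx = wy = wsum = 0.0
        for px, py in points:
            d = math.hypot(px - x, py - y)
            if d < eps:          # iterate coincides with a data point;
                return (px, py)  # a common safeguard, as the weight blows up
            w = 1.0 / d
            wx += w * px
            wy += w * py
            wsum += w
        x, y = wx / wsum, wy / wsum
    return (x, y)
```

For four points at the corners of a square, the iteration returns the center, which is the Fermat-Weber point by symmetry.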
A LC-MS/MS confirmatory method for determination of chloramphenicol in real samples screened by competitive immunoassay
A new liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed to confirm chloramphenicol (CAP) residues in foods of animal origin and in urine samples that had earlier been found positive in screening analysis performed by a competitive enzyme-linked immunosorbent assay (ELISA). The developed LC-MS/MS method was applied to four non-compliant samples from 2008 to 2012, giving concentrations of CAP residues from 1.18 to 3.68 μg kg−1. All samples qualified as positive by ELISA were confirmed with the LC-MS/MS technique and found to be non-compliant. The effectiveness of the confirmatory method was proven by successful participation in a proficiency test in 2010. Both the LC-MS/MS and ELISA methods were validated according to European Union Decision 2002/657/EC. The decision limit of the confirmatory method was determined as 0.02 μg kg−1 for CAP in each validated matrix, while the detection capability of the screening test was 0.15 μg kg−1.
Inter-comparison of relative stopping power estimation models for proton therapy
Theoretical stopping power values were inter-compared for the Bichsel, Janni, ICRU and Schneider relative stopping power (RSP) estimation models, for a variety of tissues and tissue substitute materials taken from the literature. The RSPs of eleven plastic tissue substitutes were measured using Bragg peak shift measurements in water in order to establish a gold standard of RSP values specific to our centre's proton beam characteristics. The theoretical tissue substitute RSP values were computed based on literature compositions to assess the four different computation approaches. The Bichsel/Janni/ICRU approaches led to mean errors in the RSP of −0.1/+0.7/−0.8%, respectively. Errors when using the Schneider approach, with I-values from the Bichsel, Janni and ICRU sources, followed the same pattern but were generally larger. Following this, the mean elemental ionisation energies were optimized until the theoretical RSP values matched the measurements. Failing to use optimized I-values when applying the Schneider technique to 72 human tissues could introduce errors in the RSP of up to −1.7/+1.1/−0.4% when using Bichsel/Janni/ICRU I-values, respectively. As such, it may be necessary to introduce an additional step in the current stoichiometric calibration procedure in which tissue insert RSPs are measured in a proton beam. Elemental I-values can then be optimized to match these measurements, reducing the uncertainty when calculating human tissue RSPs.
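A Schneider-style RSP computation of the kind being inter-compared can be sketched as the Bethe-formula ratio of the medium's stopping power to water's. This is a minimal sketch assuming the plain Bethe formula with no shell or density corrections and a water I-value of 75 eV (a common but not universal choice); the function name and parameters are illustrative:

```python
import math

M_E_C2 = 0.511e6  # electron rest energy, eV

def rsp_schneider(rel_electron_density, i_medium_ev, beta, i_water_ev=75.0):
    """Relative stopping power of a medium with respect to water, from the
    uncorrected Bethe formula: the ratio of the two Bethe logarithmic
    terms, scaled by the electron density relative to water.

    beta is the proton speed as a fraction of c; I-values are in eV.
    """
    b2 = beta * beta
    num = math.log(2.0 * M_E_C2 * b2 / (i_medium_ev * (1.0 - b2))) - b2
    den = math.log(2.0 * M_E_C2 * b2 / (i_water_ev * (1.0 - b2))) - b2
    return rel_electron_density * num / den
```

Because the I-value only enters through the logarithm, the RSP is weakly but systematically sensitive to it, which is why the abstract's percent-level errors can be reduced by optimizing elemental I-values against measured insert RSPs.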
Engineering Art Galleries
The Art Gallery Problem is one of the most well-known problems in
Computational Geometry, with a rich history in the study of algorithms,
complexity, and variants. Recently there has been a surge in experimental work
on the problem. In this survey, we describe this work, show the chronology of
developments, and compare current algorithms, including two unpublished
versions, in an exhaustive experiment. Furthermore, we show what core
algorithmic ingredients have led to recent successes.
Quantum capacity under adversarial quantum noise: arbitrarily varying quantum channels
We investigate entanglement transmission over an unknown channel in the
presence of a third party (called the adversary), which is enabled to choose
the channel from a given set of memoryless but non-stationary channels without
informing the legitimate sender and receiver about the particular choice that
he made. This channel model is called arbitrarily varying quantum channel
(AVQC). We derive a quantum version of Ahlswede's dichotomy for classical
arbitrarily varying channels. This includes a regularized formula for the
common randomness-assisted capacity for entanglement transmission of an AVQC.
Quite surprisingly and in contrast to the classical analog of the problem
involving the maximal and average error probability, we find that the capacity
for entanglement transmission of an AVQC always equals its strong subspace
transmission capacity. These results are accompanied by different notions of
symmetrizability (zero-capacity conditions) as well as by conditions for an
AVQC to have a capacity described by a single-letter formula. In the final
part of the paper, the capacity of the erasure-AVQC is computed and some
light is shed
on the connection between AVQCs and zero-error capacities. Additionally, we
show by entirely elementary and operational arguments motivated by the theory
of AVQCs that the quantum, classical, and entanglement-assisted zero-error
capacities of quantum channels are generically zero and are discontinuous at
every positivity point.Comment: 49 pages, no figures, final version of our papers arXiv:1010.0418v2
and arXiv:1010.0418. Published "Online First" in Communications in
Mathematical Physics, 201
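For context, the classical dichotomy of Ahlswede that the paper quantizes states, roughly, that the deterministic-code capacity of an arbitrarily varying channel under the average error criterion is either zero or equals its random-code capacity. A hedged sketch in standard notation (the symbols are illustrative, not taken from the paper):

```latex
% Ahlswede's dichotomy for a classical AVC \mathcal{W} (average error):
% the deterministic-code capacity C_d is either zero or coincides with
% the common-randomness-assisted (random-code) capacity C_r.
C_d(\mathcal{W}) \in \{\, 0,\; C_r(\mathcal{W}) \,\},
\qquad
C_r(\mathcal{W}) \;=\; \max_{p} \, \min_{W \in \overline{\mathcal{W}}} \, I(p, W),
```

where the minimum runs over the convex hull $\overline{\mathcal{W}}$ of the channel set. The abstract's "regularized formula" is the quantum analogue of $C_r$, with the single-letter mutual information replaced by a regularized entanglement-transmission quantity.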